
    Empirical Evaluation and Architecture Design for Big Monitoring Data Analysis

    Today, monitoring data analysis is an underrated but important process in tech industries. Almost every industry gathers and analyzes monitoring data to improve its services or to predict critical issues in advance. However, monitoring data exhibits the V's of big data (i.e. volume, variety, velocity, value, and veracity), and exploring big monitoring data poses several issues and challenges. Firstly, a wide range of monitoring data analysis tools is available, and these tools offer a variety of functional and non-functional features that affect the analysis process; each feature also comes with its own setbacks. Selecting a suitable monitoring data analysis tool is therefore challenging. Secondly, the big monitoring data analysis process consists of two main operations: querying and processing a large amount of data. Since the volume of monitoring data is large, these operations require a scalable and reliable architecture to extract, aggregate, and analyze data at an arbitrary granularity. Ultimately, the analysis results form the knowledge of the system and should be shared and communicated. The contribution of this research study is two-fold. Firstly, we propose a generic performance evaluation methodology based on the Design of Experiment (DoE) method for the assessment of tools, workflows, and techniques; the evaluation results generated from this methodology provide a basis for tool selection. Secondly, we design and implement a big monitoring data analysis architecture that provides advanced analytics, such as workload forecasting and pattern matching, in an available and scalable environment. We implement our design using distributed tools such as Apache Solr, Apache Hadoop, and Apache Spark, and we assess performance aspects (i.e. latency and fault tolerance) of the architecture using the proposed evaluation method.
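    A minimal Apache Spark sketch illustrates the kind of aggregation step such an architecture performs before forecasting or pattern matching. This is not the authors' implementation: the input path, the column names (timestamp, cell_id, connected_users), and the one-hour granularity are assumptions chosen for illustration only.

```python
# Minimal sketch (assumed schema, not the paper's code): aggregate raw
# monitoring records at a chosen time granularity with Apache Spark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("monitoring-aggregation").getOrCreate()

# Hypothetical location and schema of raw monitoring records on HDFS.
df = spark.read.parquet("hdfs:///monitoring/raw/")

# Average number of connected users per cell in one-hour windows;
# the window duration stands in for the "arbitrary granularity" above.
hourly = (
    df.groupBy(F.window("timestamp", "1 hour"), "cell_id")
      .agg(F.avg("connected_users").alias("avg_connected_users"))
)

# Persist the aggregated series as input for downstream forecasting.
hourly.write.mode("overwrite").parquet("hdfs:///monitoring/hourly/")
```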

    Evaluation of data mining tools for telecommunication monitoring data using design of experiment

    Telecommunication monitoring data requires the automation of data analysis workflows. A data mining tool provides workflow management to process data and perform analysis tasks. This paper presents an evaluation of two example data mining tools, following the principles of design of experiment (DoE), on forecasting and clustering workflows for telecom monitoring data. We conduct both quantitative and qualitative evaluations on datasets collected from a trial mobile network. The datasets cover one-month, six-month, one-year, and two-year time frames and provide the average number of connected users per cell on base stations. The observations from this evaluation provide insights into each data mining tool in the context of data analysis workflows. The documented design of experiment further facilitates replicating this evaluation study and evaluating other data mining tools.
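    The DoE setup can be pictured as a full-factorial run matrix over tools, workflows, and dataset time frames. The factor names and levels below are hypothetical placeholders (the abstract does not list the paper's actual factors); the sketch only shows how every combination would be enumerated into experimental runs.

```python
# Minimal sketch (hypothetical factors and levels): a full-factorial
# design-of-experiment run matrix for comparing data mining tools.
import itertools

factors = {
    "tool": ["tool_A", "tool_B"],                              # tools under test
    "workflow": ["forecasting", "clustering"],                 # analysis workflows
    "dataset": ["1_month", "6_months", "1_year", "2_years"],   # time frames
}

# Enumerate every factor-level combination (2 x 2 x 4 = 16 runs).
runs = [dict(zip(factors, levels)) for levels in itertools.product(*factors.values())]

for i, run in enumerate(runs, start=1):
    print(f"run {i:02d}: {run}")
```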